Optimize the grouping behavior when finding by large path #347
nbekirov wants to merge 2 commits into ClosureTree:master
Conversation
|
That's some pretty good optimization. Did you check the cost of both queries?
|
@seuros, no, actually. For me, it was enough that a path size of 49 was not calling ... I may check the performance of the big SQL under Postgres, for example, if we want to reduce the ...
|
Since you have the data already populated, can you append ...?
|
Sorry, already out of office.
|
Now that I think of it, why would we even keep the if check for “large paths”? We can just always split into groups of size n; for n < 50 it would behave exactly as it used to.
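For what it's worth, a minimal irb sketch of why the unconditional split is safe (assuming ActiveSupport's in_groups_of and the current max_join_tables of 50; the 49-element array is just a stand-in for a short path):

```ruby
require "active_support/core_ext/array/grouping"

small_path = (1..49).to_a   # a path shorter than the 50-element threshold

# Splitting unconditionally into groups of size 50 leaves a short path
# as a single group, so behavior is unchanged for paths under 50 elements.
small_path.in_groups_of(50, false).size   # => 1
```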
P.S. Is there any chance of backporting this to 6.x? I noticed that 7 has some new gem requirements that prevent me from using the new version in my current project.
|
@seuros, here's the promised data:
|
Can you add a Changelog entry to this PR? Add it under version 7.1.0.
|
Done :)
|
Hello! Can I help with anything more on this one?
Hello!
While testing how closure_tree handles deep nesting for a project, I noticed that at some point it started to issue an unreasonably large number of SQL queries. Looking at max_join_tables (currently 50), I assumed that for 55 levels of nesting there would be something like 2 queries, not 50! I tracked down the issue to the use of in_groups, which splits the path array into "(the given) number of groups", and replaced it with in_groups_of, which instead splits "in groups of (the given) size".
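The difference is easy to see in isolation (a minimal sketch, assuming only ActiveSupport; the 55-element array stands in for a 55-level ancestry path):

```ruby
require "active_support/core_ext/array/grouping"

path = (1..55).to_a   # stand-in for a 55-level ancestry path

# in_groups(n) splits the array into n groups, so a 55-element path
# yields 50 tiny groups, i.e. roughly one query per group.
path.in_groups(50, false).size      # => 50

# in_groups_of(n, false) splits into groups of at most n elements,
# so the same path yields only 2 groups (50 + 5), i.e. about 2 queries.
path.in_groups_of(50, false).size   # => 2
```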
Check out the updated spec example SQL output: